An Improved Conjugate Gradient Based Learning Algorithm for Back Propagation Neural Networks
Authors
Abstract
The conjugate gradient optimization algorithm is combined with the modified back propagation algorithm to yield a computationally efficient algorithm for training multilayer perceptron (MLP) networks (CGFR/AG). Computational efficiency is enhanced by adaptively modifying the initial search direction, as described in the following steps: (1) modification of the standard back propagation algorithm by introducing a gain variation term in the activation function; (2) calculation of the gradient of the error with respect to the weight and gain values; and (3) determination of a new search direction using the information calculated in step (2). The performance of the proposed method is demonstrated by comparing its accuracy and computation time with those of the conjugate gradient algorithm in the MATLAB neural network toolbox. The results show that the computational efficiency of the proposed method is better than that of the standard conjugate gradient algorithm.

Keywords—Adaptive gain variation, back-propagation, activation function, conjugate gradient, search direction.
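Steps (1) and (3) above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the gain-modified logistic activation follows step (1), and the search-direction update assumes the Fletcher-Reeves coefficient (suggested by the "FR" in CGFR/AG); all function and parameter names are illustrative.

```python
import numpy as np

def sigmoid(x, gain=1.0):
    # Step (1): logistic activation with an adaptive gain term c,
    # f(x) = 1 / (1 + exp(-c * x)); gain c is trained alongside the weights.
    return 1.0 / (1.0 + np.exp(-gain * x))

def fletcher_reeves_direction(grad_new, grad_old, dir_old):
    # Step (3): new search direction d_k = -g_k + beta_k * d_{k-1},
    # with the Fletcher-Reeves coefficient beta_k = ||g_k||^2 / ||g_{k-1}||^2.
    beta = np.dot(grad_new, grad_new) / np.dot(grad_old, grad_old)
    return -grad_new + beta * dir_old
```

In a full trainer, the gradient in step (2) would be taken with respect to both the weights and the gain values, and the direction above would be followed with a line search at each epoch.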
Similar Articles
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the back propagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. The analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs, designed in MATLAB, based on portions (1...
Classification of ECG signals using Hermite functions and MLP neural networks
Classification of heart arrhythmia is an important step in developing devices for monitoring the health of individuals. This paper proposes a three-module system for classification of electrocardiogram (ECG) beats. These modules are: a denoising module, a feature extraction module, and a classification module. In the first module the stationary wavelet transform (SWT) is used for noise reduction of ...
An Improved Learning Algorithm based on the Conjugate Gradient Method for Back Propagation Neural Networks
The conjugate gradient optimization algorithm, usually used for nonlinear least squares, is presented and combined with the modified back propagation algorithm, yielding a new fast training algorithm for multilayer perceptron (MLP) networks (CGFR/AG). The approaches presented in the paper consist of three steps: (1) modification of the standard back propagation algorithm by introducing a gain variation term of th...
Geoid Determination Based on Log Sigmoid Function of Artificial Neural Networks: (A Case Study: Iran)
A Back Propagation Artificial Neural Network (BPANN) is a well-known learning algorithm predicated on a gradient descent method that minimizes the square error between the network output and the target output values. In this study, 261 GPS/Leveling and 8869 gravity intensity values of Iran were selected; the geoid was then determined with three methods, "ellipsoidal Stokes integral", "BPANN", and "collocation" ...
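The log-sigmoid activation and the squared-error gradient descent step described in this snippet can be sketched for a single neuron as follows; the function names and the learning rate are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def logsig(x):
    # Log-sigmoid activation, f(x) = 1 / (1 + e^-x),
    # as used by BPANN (cf. MATLAB's logsig).
    return 1.0 / (1.0 + np.exp(-x))

def gd_step(w, x, t, lr=0.1):
    # One gradient-descent step for a single log-sigmoid neuron,
    # minimizing the squared error (y - t)^2 (illustrative sketch).
    y = logsig(np.dot(w, x))
    err = y - t
    # d/dw of the squared error uses logsig's derivative y * (1 - y).
    grad = err * y * (1.0 - y) * x
    return w - lr * grad
```

A full BPANN would apply this update layer by layer via back propagation rather than to a single neuron.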
Simulation of Narrow Band Speech Signal using BPN Networks
This paper proposes to extend the bandwidth of narrow-band telephone speech signals by employing a feed-forward back propagation neural network. Several faster training algorithms are available in the literature, such as Variable Learning Rate, Resilient Back propagation, Polak-Ribiére Conjugate Gradient, Conjugate Gradient with Powell/Beale Restarts, BFGS Quasi-Newton, One-Ste...